Mixture of Gaussian Processes Based on Bayesian Optimization

Authors

Abstract

This paper gives a detailed introduction to implementing the mixture of Gaussian processes (MGP) model and develops its application to Bayesian optimization (BayesOpt). It also introduces techniques for finding the MGP components and an alternative gating network based on Dirichlet distributions. BayesOpt using the resultant MGP significantly outperforms BayesOpt using a single GP regression in terms of efficiency when tuning the hyperparameters of common machine learning algorithms. This indicates the success of the methods, implying a promising future for their wider application.
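The abstract does not include implementation details, but the general idea it describes, a mixture of GP surrogates whose components are combined by a gating mechanism and then used to drive a Bayesian optimization loop, can be sketched minimally. The sketch below is not the authors' method: the fixed length-scales, the marginal-likelihood softmax weighting (standing in for the Dirichlet gating network), the grid-based acquisition search, and the toy objective are all illustrative assumptions.

```python
import math
import numpy as np

def rbf(x1, x2, ls):
    # Squared-exponential kernel between 1-D point sets (unit signal variance).
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(xtr, ytr, xte, ls, noise=1e-4):
    # Posterior mean/variance and log marginal likelihood of one GP component.
    K = rbf(xtr, xtr, ls) + noise * np.eye(len(xtr))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    Ks = rbf(xtr, xte, ls)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.maximum(1.0 - np.sum(v**2, axis=0), 1e-12) + noise
    lml = (-0.5 * ytr @ alpha - np.log(np.diag(L)).sum()
           - 0.5 * len(ytr) * math.log(2 * math.pi))
    return mu, var, lml

def mixture_posterior(xtr, ytr, xte, length_scales=(0.1, 0.3, 1.0)):
    # Moment-matched mixture of GP components. The marginal-likelihood softmax
    # weighting is an illustrative stand-in for the paper's Dirichlet gating.
    mus, vs, lmls = zip(*(gp_posterior(xtr, ytr, xte, ls) for ls in length_scales))
    w = np.exp(np.array(lmls) - max(lmls))
    w /= w.sum()
    mu = sum(wi * m for wi, m in zip(w, mus))
    var = sum(wi * (v + m**2) for wi, v, m in zip(w, vs, mus)) - mu**2
    return mu, np.maximum(var, 1e-12), w

def expected_improvement(mu, var, y_best):
    # EI for minimization under a Gaussian predictive distribution.
    s = np.sqrt(var)
    z = (y_best - mu) / s
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2)))
    phi = np.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)
    return (y_best - mu) * Phi + s * phi

def bayesopt(f, n_iter=6):
    # BayesOpt loop: fit the mixture surrogate, maximize EI over a fixed grid.
    grid = np.linspace(0.0, 1.0, 101)
    xtr = np.array([0.0, 0.5, 1.0])
    ytr = f(xtr)
    for _ in range(n_iter):
        mu, var, _ = mixture_posterior(xtr, ytr, grid)
        ei = expected_improvement(mu, var, ytr.min())
        ei[np.isin(grid, xtr)] = -np.inf  # do not resample known points
        x_new = grid[np.argmax(ei)]
        xtr = np.append(xtr, x_new)
        ytr = np.append(ytr, f(x_new))
    i = np.argmin(ytr)
    return xtr[i], ytr[i]

# Hypothetical toy objective with its minimum at x = 0.7.
x_best, y_best = bayesopt(lambda x: (x - 0.7) ** 2)
```

In a real hyperparameter-tuning setting, the grid search over the acquisition function would be replaced by a continuous optimizer, and the component structure and gating would be learned from data rather than fixed in advance.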


Related articles

Sparse Gaussian Processes for Bayesian Optimization

Bayesian optimization schemes often rely on Gaussian processes (GP). GP models are very flexible, but are known to scale poorly with the number of training points. While several efficient sparse GP models are known, they have limitations when applied in optimization settings. We propose a novel Bayesian optimization framework that uses sparse online Gaussian processes. We introduce a new updati...


Gaussian mixture optimization for HMM based on efficient cross-validation

A Gaussian mixture optimization method is explored using cross-validation likelihood as an objective function instead of the conventional training-set likelihood. The optimization is based on reducing the number of mixture components by selecting and merging a pair of Gaussians step by step, based on the objective function, so as to remove redundant components and improve the generality of the mod...


MiDGaP: Mixture Density Gaussian Processes

Gaussian Processes (GPs) have become a core technique in machine learning over the last decade, with numerous extensions and applications. Although several approaches exist for warping the conditional Gaussian posterior distribution to other members of the exponential family, most tacitly assume a unimodal posterior. In this paper we present a mixture density model (MDM) allowing multi-modal po...


Bayesian Warped Gaussian Processes

Warped Gaussian processes (WGP) [1] model output observations in regression tasks as a parametric nonlinear transformation of a Gaussian process (GP). The use of this nonlinear transformation, which is included as part of the probabilistic model, was shown to enhance performance by providing a better prior model on several data sets. In order to learn its parameters, maximum likelihood was used...


Locally-Biased Bayesian Optimization using Nonstationary Gaussian Processes

Bayesian optimization is becoming a fundamental global optimization algorithm in many applications where sample efficiency is needed, such as automatic machine learning, robotics, reinforcement learning, experimental design, and simulation. The most popular and effective Bayesian optimization method relies on a stationary Gaussian process as surrogate. In this paper, we present a novel n...



Journal

Journal title: Journal of Sensors

Year: 2022

ISSN: 1687-725X, 1687-7268

DOI: https://doi.org/10.1155/2022/7646554